KMID : 0381519990110010037
Korean Journal of Medical Education
1999 Volume.11 No. 1 p.37 ~ p.52
The Effect of Using Two Duplicated Examination Sites to Simulate the Same Cases on the OSCE Reliability
Park Hoon-Ki
Lee Jung-Kwon
Kim Seung-Ryong
Kim Kyung-Tai
Park Hae-Young
Abstract
In large-scale testing programs, OSCE stations may be duplicated across two or more sites. Few studies have examined the reliability of OSCEs with duplicated stations in Korea. The purpose of this study was to investigate the effect of duplication on OSCE reliability.
At Hanyang University College of Medicine, an OSCE was given to all senior medical students (91 per class) upon completion of all clinical clerkship rotations. The examination consisted of twenty-one stations and eighteen cases representing commonly encountered problems in primary care. Each station took seven minutes to administer: 6 to 6.5 minutes for the student-SP or model encounter, during which the students performed a complete focused history and/or physical examination and/or procedure and/or management, and another 0.5 to 1 minute for the evaluator to give feedback with case-related comments. We analysed the reliability of duplication by comparing total OSCE scores and case scores between the two examination sites. We also evaluated the reliability of the duplicated stations from students' and professors' subjective responses to the OSCE.
The findings of this study were as follows: 1) All 91 fourth-year students attended the OSCE. The standardized Cronbach's α coefficient of the OSCE was 0.67. The station scores and total OSCE scores differed between the two duplication sites. 2) The total OSCE score at one site was slightly higher than that at the other site (p=0.03). Of the 19 stations in which students were evaluated by staff evaluators, six stations were more advantageous at one site compared with their counterpart stations, and another six stations were the reverse. There was no significant sequence effect. More than half of the students and one third of the evaluators did not accept the reliability of duplication. Considering that most students agreed on the reliability of the SPs and that the difference in written examination scores over the previous three years was statistically insignificant, inter-rater reliability was the determining factor for the differences in OSCE scores and case scores between the two duplicated sites.

Conclusions: OSCE reliability can be affected by duplication of examination sites, and inter-rater reliability is the most important determining factor. The results demonstrate a need for caution in interpreting scores obtained from an OSCE with duplicated stations.
KEYWORDS
Evaluation, Reliability, OSCE, Duplication
Listed journal information
Korea Research Foundation (KCI), KoreaMed